In telecommunication, a burst error or error burst is a contiguous sequence of symbols, received over a data transmission channel, such that the first and last symbols are in error and there exists no contiguous subsequence of m correctly received symbols within the error burst.[1]
The integer parameter m is referred to as the guard band of the error burst. The last symbol in a burst and the first symbol in the following burst are accordingly separated by m or more correctly received symbols. The parameter m should be specified when describing an error burst.
The length of a burst of bit errors in a frame is defined as the number of bits from the first error to the last, inclusive.
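The two definitions above can be illustrated with a short sketch. The function name find_bursts and the representation of the received sequence as 0/1 error indicators (1 = symbol in error, 0 = symbol received correctly) are illustrative assumptions, not part of the standard.

```python
def find_bursts(errors, m):
    """Segment a sequence of error indicators into bursts for guard band m.

    Two errors belong to the same burst if fewer than m correctly received
    symbols lie between them; the burst length is the number of symbols from
    the first error to the last, inclusive.
    """
    bursts = []
    start = last = None
    for i, e in enumerate(errors):
        if e:
            # A gap of m or more correct symbols starts a new burst.
            if start is not None and i - last > m:
                bursts.append((start, last - start + 1))
                start = i
            elif start is None:
                start = i
            last = i
    if start is not None:
        bursts.append((start, last - start + 1))
    return bursts  # list of (start index, burst length) pairs


# Example: with m = 3, the errors at positions 1 and 4 are separated by only
# two correct symbols and form one burst of length 4, while the error at
# position 9 follows four correct symbols and forms a separate burst.
print(find_bursts([0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0], m=3))  # [(1, 4), (9, 1)]
```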
The Gilbert–Elliott model is a simple channel model introduced by Edgar Gilbert[2] and E. O. Elliott,[3] widely used for describing burst error patterns in transmission channels and for simulating the digital error performance of communication links. It is based on a Markov chain with two states, G (for good or gap) and B (for bad or burst). In state G the probability of transmitting a bit correctly is k, and in state B it is h. It is usually assumed that k = 1;[4] Gilbert further assumed that h = 0.5.
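A minimal simulation sketch of such a two-state channel is shown below. The transition probabilities p_gb (from G to B) and p_bg (from B to G), the function name, and the default values are illustrative assumptions rather than standard notation.

```python
import random


def gilbert_elliott(n, p_gb, p_bg, k=1.0, h=0.5, seed=None):
    """Simulate n bit transmissions over a Gilbert-Elliott channel.

    p_gb: probability of moving from the good state G to the bad state B
    p_bg: probability of moving from the bad state B back to G
    k, h: probabilities of correct transmission in states G and B
          (k = 1 and h = 0.5 correspond to Gilbert's assumptions)

    Returns a list of error indicators (1 = bit received in error).
    """
    rng = random.Random(seed)
    in_good_state = True
    errors = []
    for _ in range(n):
        p_correct = k if in_good_state else h
        errors.append(0 if rng.random() < p_correct else 1)
        # Transition of the two-state Markov chain.
        if in_good_state:
            in_good_state = rng.random() >= p_gb
        else:
            in_good_state = rng.random() < p_bg
    return errors


# With a small p_gb and a larger p_bg, the channel stays mostly in state G,
# producing long error-free gaps punctuated by short error bursts.
pattern = gilbert_elliott(1000, p_gb=0.01, p_bg=0.3, seed=42)
print(sum(pattern), "errors in", len(pattern), "bits")
```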
This article incorporates public domain material from the General Services Administration document "Federal Standard 1037C" (in support of MIL-STD-188).